Title: Green cloud computing schemes based on networks: a survey
Authors: N. Xiong, W. Han, A. Vandenberg
Abstract: The authors are particularly aware that green cloud computing (GCC) is a broad and active field. The distinction between 'consumer of' and 'provider of' cloud-based energy resources may be very important in creating a world-wide ecosystem of GCC. A user simply submits a service request to the cloud service provider over the Internet or wired/wireless networks. The result of the requested service is delivered back to the user in time, while information storage and processing, interoperating protocols, service composition, communications and distributed computing all interact smoothly over the networks. This study is a survey of GCC schemes based on networks. The concept and history of green computing are introduced first, followed by the challenges and requirements of cloud computing. Cloud computing needs to become green, meaning that cloud services are provisioned while considering energy consumption under a set of energy-consumption criteria; this is called GCC. Furthermore, recent work in GCC based on networks, covering microprocessors, task scheduling algorithms, virtualisation technology, cooling systems, networks and disk storage, is introduced. After that, the work on GCC by the authors' research group at Georgia State University is presented. Finally, conclusions and some future work are given.
Title: Design of Command, Data and Telemetry Handling System for a Distributed Computing Architecture CubeSat
Authors: Sharan A. Asundi, Norman G. FitzCoy
Abstract: Among the size, weight and power constraints imposed by the CubeSat specification, the limitation associated with power can be addressed through a distributed computing architecture. This paper describes such a distributed computing architecture and its operational design in the form of a command and data handling system and telemetry formulation, adapted for a CubeSat whose power requirements for proving the mission are significantly larger than the on-orbit average power generated. The 1U CubeSat with the mission objective of precision three-axis attitude control is composed of a low-power flight computer and a high-power, high-speed auxiliary processor (CMG controller), along with a high-capacity battery. The precision sensors, actuators and complex computing algorithms are interfaced and implemented on the high-speed auxiliary processor, which is operated intermittently. Health monitoring sensors, the transceiver and other housekeeping tasks are interfaced and implemented on the flight computer, which is in continuous operation. To facilitate effective operation and telemetry packaging, each computing unit is designed to host a storage device. The flight software, designed as operating modes, is distributed across the two computing platforms. Distributed operations are initiated through the flight computer and executed on the auxiliary processor. The paper describes in detail the distributed design of these operating modes as flowcharts and the associated telemetry budget as tables.
Title: Quantum Cryptography: A New Generation of Information Technology Security System
Authors: Sharbaf, M.S.
Abstract: Quantum cryptography is an emerging technology in which two parties can secure network communications by applying the phenomena of quantum physics. The security of these transmissions is based on the inviolability of the laws of quantum mechanics. Quantum cryptography was born in the early seventies when Steven Wiesner wrote "Conjugate Coding", a paper that took more than ten years to be published. Quantum cryptography relies on two important elements of quantum mechanics: the Heisenberg uncertainty principle and the principle of photon polarization. The Heisenberg uncertainty principle states that it is not possible to measure the quantum state of any system without disturbing that system. The principle of photon polarization states that an eavesdropper cannot copy unknown qubits, i.e. unknown quantum states, due to the no-cloning theorem, first presented by Wootters and Zurek in 1982. This research paper concentrates on the theory of quantum cryptography and how this technology contributes to network security. It summarizes the current state of quantum cryptography, real-world implementations of this technology, and finally the future direction in which quantum cryptography is headed.
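For illustration, a toy sketch of BB84-style sifting, the canonical protocol built on these two principles; it assumes an ideal, noise-free channel with no eavesdropper, and all names are illustrative:

```python
import secrets

def bb84_sift(n_bits: int):
    """Toy BB84 sifting: ideal channel, no eavesdropper, no noise."""
    # Alice picks random bits and random bases (0 = rectilinear, 1 = diagonal).
    alice_bits  = [secrets.randbelow(2) for _ in range(n_bits)]
    alice_bases = [secrets.randbelow(2) for _ in range(n_bits)]
    # Bob measures each photon in a randomly chosen basis.
    bob_bases = [secrets.randbelow(2) for _ in range(n_bits)]
    # When bases match, Bob reads Alice's bit; otherwise his result is random.
    bob_bits = [a if ab == bb else secrets.randbelow(2)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Public basis comparison: keep only positions where the bases agree.
    return [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)
            if ab == bb]

print(bb84_sift(16))  # roughly half the raw bits survive sifting
```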
Abstract: In this paper we propose a noise-induced, HSI-model-based segmentation technique for noisy and blurred colour images. This approach uses additive noise to suppress the effect of internal noise present in an image, for proper detection of objects in such images. In this algorithm we decompose a given image into Hue, Saturation and Intensity (HSI) components and then apply processing to the intensity component of the decomposed image. We measure the performance of the proposed algorithm in terms of the correlation coefficient and the number of mismatched pixels. The effectiveness of the proposed algorithm is compared with different existing techniques. It is observed that the computational complexity of our algorithm is lower than that of several existing techniques, because it deals only with the intensity component of the decomposed image. Furthermore, as an additional advantage, our segmentation technique gives better performance than SSR-based segmentation using the RGB model, SR-extended, integrated region matching, watershed, and marker-controlled watershed based segmentation methods.
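A minimal sketch of the decompose-process-recompose idea, using Python's colorsys HSV conversion as a stand-in for HSI and a placeholder intensity operation; the paper's noise-induced processing is not reproduced:

```python
import colorsys

def process_intensity(rgb_pixels, f):
    """Decompose RGB pixels into hue/saturation/intensity-like channels,
    apply f to the intensity channel only, and recompose (sketch only;
    HSV is used here as a stand-in for HSI)."""
    out = []
    for r, g, b in rgb_pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        v = min(max(f(v), 0.0), 1.0)          # process intensity only
        r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
        out.append((round(r2 * 255), round(g2 * 255), round(b2 * 255)))
    return out

# Example: simple intensity thresholding as the placeholder operation.
print(process_intensity([(200, 30, 30), (10, 10, 10)],
                        lambda v: 1.0 if v > 0.5 else 0.0))
```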
Title: Automatic Abstract Service Generation from Web Service Communities
Authors: Xumin Liu, Hua Liu
Abstract: The concept of abstract services has been widely adopted in service computing to specify the functionality of certain types of Web services. It significantly benefits key service management tasks, such as service discovery and composition, as these tasks can first be applied to a small number of abstract services and then mapped to the large-scale actual services. However, how to generate abstract services is non-trivial. Current approaches either assume the existence of abstract services or adopt a manual process that demands intensive human intervention. We propose a novel approach to fully automate the generation of abstract services from a service community that consists of a set of functionally similar services. A set of candidate outputs is first discovered based on a predefined support ratio, which determines the minimum number of services that produce the outputs. Then, the matching inputs are identified to form the abstract services. We propose a set of heuristics to effectively prune a large number of candidate abstract services. A comprehensive experimental study on real-world web service data is conducted to demonstrate the effectiveness and efficiency of the proposed approach.
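A minimal reading of the support-ratio step: keep an output if at least that fraction of the community's services produce it. Service names and data here are invented for illustration:

```python
from collections import Counter

def candidate_outputs(services, support_ratio):
    """Return outputs produced by at least support_ratio of the services
    in a community (illustrative reading of the support-ratio step)."""
    counts = Counter(o for outputs in services.values() for o in set(outputs))
    threshold = support_ratio * len(services)
    return {o for o, c in counts.items() if c >= threshold}

community = {
    "HotelSvcA": ["price", "availability", "address"],
    "HotelSvcB": ["price", "availability"],
    "HotelSvcC": ["price", "photos"],
}
print(candidate_outputs(community, 0.6))  # {'price', 'availability'}
```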
Title: Tracking On-line Radicalization Using Investigative Data Mining
Authors: Pooja Wadhwa, MPS Bhatia
Abstract: The increasing complexity and emergence of Web 2.0 applications have paved the way for threats arising from the use of social networks by cyber extremists (radical groups). Radicalization (also called cyber extremism and cyber hate propaganda) is a growing concern to society and of great pertinence to governments and law enforcement agencies across the world. Further, the dynamism of these groups adds another level of complexity to the domain: over time, one may witness a change in the members of a group, which has motivated many researchers towards this field. This proposal presents an investigative data mining approach for detecting the dynamic behavior of these radical groups in online social networks, through textual analysis of the messages posted by group members along with techniques used in social network analysis. Some preliminary results obtained through partial implementation of the approach are also discussed.
Title: Query-Specific Visual Semantic Spaces for Web Image Re-ranking
Authors: Xiaogang Wang, Ke Liu, Xiaoou Tang
Abstract: Image re-ranking, as an effective way to improve the results of web-based image search, has been adopted by current commercial search engines. Given a query keyword, a pool of images is first retrieved by the search engine based on textual information. By asking the user to select a query image from the pool, the remaining images are re-ranked based on their visual similarities with the query image. A major challenge is that the similarities of visual features do not correlate well with images' semantic meanings, which interpret users' search intention. On the other hand, learning a universal visual semantic space to characterize highly diverse images from the web is difficult and inefficient. In this paper, we propose a novel image re-ranking framework, which automatically learns offline different visual semantic spaces for different query keywords through keyword expansions. The visual features of images are projected into their related visual semantic spaces to get semantic signatures. At the online stage, images are re-ranked by comparing their semantic signatures obtained from the visual semantic space specified by the query keyword. The new approach significantly improves both the accuracy and efficiency of image re-ranking. The original visual features of thousands of dimensions can be projected to semantic signatures as short as 25 dimensions. Experimental results show that a 20%-35% relative improvement has been achieved on re-ranking precisions compared with state-of-the-art methods.
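A schematic sketch of the signature idea: project a high-dimensional visual feature onto per-concept similarity scores and rank by signature similarity. The data is random and the learned semantic spaces and classifiers are not reproduced:

```python
import numpy as np

def semantic_signature(visual_feature, reference_centroids):
    """Project a high-dimensional visual feature onto a short signature:
    one similarity score per reference concept of the query keyword
    (a schematic stand-in for the learned semantic spaces)."""
    sims = reference_centroids @ visual_feature
    return sims / (np.linalg.norm(sims) + 1e-12)

rng = np.random.default_rng(0)
centroids = rng.normal(size=(25, 1000))   # 25 concepts, 1000-D features
query_sig = semantic_signature(rng.normal(size=1000), centroids)
pool_sigs = [semantic_signature(rng.normal(size=1000), centroids)
             for _ in range(5)]
# Re-rank the pool by signature similarity to the query image.
ranked = sorted(range(5), key=lambda i: -(pool_sigs[i] @ query_sig))
print(ranked)
```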
Abstract: The use of a camera as a biometric sensor is desirable due to its ubiquity and low cost, especially for mobile devices. The palmprint is an effective modality in such cases due to its discrimination power, ease of presentation, and the scale and size of its texture for capture by commodity cameras. However, the unconstrained nature of pose and lighting introduces several challenges in the recognition process. Even minor changes in the pose of the palm can induce significant changes in the visibility of the lines. We turn this property to our advantage by capturing a short video, where the natural palm motion induces minor pose variations, providing additional texture information. We propose a method to register multiple frames of the video without requiring correspondence, while being efficient. Experimental results on a set of 100 different palms show that the use of multiple frames reduces the error rate from 12.75% to 4.7%. We also propose a method for detecting poor-quality samples due to specularities and motion blur, which further reduces the EER to 1.8%.
Title: Agile and Efficient MIMO System for Smart Phone Terminals
Authors: Osama N. Alrabad, Elpiniki P. Tsakalaki, Mauro Pelosi, Gert F. Pedersen
Abstract: The paper proposes a novel multi-input multi-output (MIMO) system architecture that covers most of the high LTE bands. The system is comprised of four small loop antennas. Each antenna has two ports, one for communication and one for control. The control port is used for tuning the loop antenna, where impressive frequency agility is obtained using a single capacitor. A good level of inherent isolation among the four loop antennas is maintained over the different frequency bands. The MIMO performance of the proposed system is evaluated through its spectral efficiency versus frequency. Finally, the information bandwidth of the MIMO system can be defined by comparing its spectral efficiency against that of three ideal MIMO antennas.
Title: Interface-Based Object-Oriented Design with Mock Objects
Authors: Nandigam, J.; Gudivada, V.N.; Hamou-Lhadj, A.; Yonglei Tao
Abstract: Interfaces are fundamental in object-oriented systems. One of the principles of reusable object-oriented design, according to Gamma et al., is program to an interface, not an implementation. Interface-based systems display three key characteristics - flexibility, extensibility, and pluggability. Designing with interfaces is therefore a better way of building object-oriented systems. Getting students in introductory software engineering and design courses to program to interfaces and develop interface-based systems is a challenge. This paper presents our experiences with the use of mock objects to promote interface-based design and effective unit testing in software engineering and design courses.
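For illustration, a minimal Python analogue of the practice using unittest.mock: the client codes against an interface, and a mock stands in for any concrete implementation during the unit test. The class and method names are invented:

```python
from unittest.mock import Mock

class PaymentGateway:           # the interface ("program to an interface")
    def charge(self, amount: float) -> bool: ...

def checkout(gateway: PaymentGateway, amount: float) -> str:
    # Client code depends only on the interface, never on a concrete class.
    return "paid" if gateway.charge(amount) else "declined"

# Unit test: a mock object stands in for any real gateway implementation,
# so the client logic is testable before (or without) a real gateway exists.
mock_gateway = Mock(spec=PaymentGateway)
mock_gateway.charge.return_value = True
assert checkout(mock_gateway, 9.99) == "paid"
mock_gateway.charge.assert_called_once_with(9.99)
print("test passed")
```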
Title: Low Complexity Routing Algorithm for Rearrangeable Switching Networks
Authors: Amitabha Chakrabarty, Martin Collier
Abstract: Rearrangeable switching networks have long been a topic of interest among the research community. Routing algorithms for this class of networks have attracted many researchers, along with related areas such as modification of network structure, crosspoint reduction, etc. In this paper a new routing algorithm is presented for symmetric rearrangeable networks built with 2 × 2 switching elements. A new matrix-based abstraction model is derived to determine conflict-free routing paths for each input-output request. Each stage of a network is mapped into a set of sub-matrices, and the number of matrices in each stage corresponds to the number of subnetworks in that stage. Once the input permutation is given, matrix cells are populated with binary values depending on the position of the switching elements in the actual hardware and their mapped matrix cells. These binary values control the routing decision in the underlying hardware. This new routing algorithm is capable of connection setup for partial permutations, m = ρN, where N is the total number of inputs and m is the number of active inputs. Overall, the serial time complexity of this method is O(N log N) when all N inputs are active and O(m log N) with m < N active inputs. The time complexity of this routing algorithm on a parallel machine with N completely connected processors is O(log² N). With m active requests the time complexity goes down to O(log m log N), which is better than the O(log² m + log N) reported in the literature for 2^((1/2)[(log₂² N − 4 log N)^(1/2) − log N]) ≤ ρ ≤ 1. The latter half of this paper demonstrates how this routing algorithm is applicable to crosstalk-free routing in the optical domain.
Title: Noise Robust Voice Detector for Speaker Recognition
Authors: Gabriel Hernandez, Jose R. Calvo, Rafael Fernandez, Ivis Rodes and Rafael Martínez
Abstract: The effect of additive noise on a speaker recognition system is well known to be a crucial problem in real-life applications. In a speaker recognition system, if the test utterance is corrupted by any type of noise, the performance of the system degrades markedly. This work proposes the use of a noise-robust voice detector to determine which audio frames are voice frames. The new detector is implemented using an AdaBoost algorithm in a voiced-unvoiced sound classifier based on the speech waveform only. Results show better performance of robust speaker recognition based on selected voice segments, compared with unselected segments, in the presence of additive white noise.
Title: Accelerating Pollard’s Rho Algorithm on Finite Fields
Authors: Jung Hee Cheon, Jin Hong, and Minkyu Kim
Abstract: Most generic and memory-efficient algorithms for solving the discrete logarithm problem construct a certain random graph consisting of group element nodes and return the solution when a collision is found among the graph nodes. In this work, we develop a technique for traveling through the random graph without fully computing each node and also provide an extension to the distinguished point collision detection method that is suitable for this new situation. Concrete constructions of this technique for multiplicative subgroups of finite fields are given. Our implementations confirm that the proposed technique provides practical speedup over existing algorithms.
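For context, a textbook Pollard rho walk for discrete logarithms with Floyd cycle detection, the baseline this paper accelerates; the paper's partial-node traversal and extended distinguished-point method are not reproduced, and the parameters are small demo values:

```python
import random

def pollard_rho_dlog(g, h, p, q):
    """Solve g^x = h (mod p), where g has prime order q (textbook version)."""
    def step(x, a, b):                    # pseudo-random walk on g^a * h^b
        s = x % 3
        if s == 0:
            return (x * x) % p, (2 * a) % q, (2 * b) % q
        if s == 1:
            return (x * g) % p, (a + 1) % q, b
        return (x * h) % p, a, (b + 1) % q

    while True:                           # retry on a degenerate collision
        a0, b0 = random.randrange(q), random.randrange(q)
        x0 = pow(g, a0, p) * pow(h, b0, p) % p
        slow = fast = (x0, a0, b0)
        while True:                       # Floyd cycle detection
            slow = step(*slow)
            fast = step(*step(*fast))
            if slow[0] == fast[0]:
                break
        # g^a1 * h^b1 = g^a2 * h^b2  =>  x = (a1 - a2) / (b2 - b1) mod q
        a1, b1, a2, b2 = slow[1], slow[2], fast[1], fast[2]
        if (b2 - b1) % q != 0:
            return (a1 - a2) * pow(b2 - b1, -1, q) % q

p, q = 1019, 509          # p = 2q + 1, both prime
g = 4                     # quadratic residue mod p, so g has order q
h = pow(g, 123, p)
print(pollard_rho_dlog(g, h, p, q))  # 123
```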
Publish Year: 2012
Published in: Journal of Cryptology - Springer
Title: Bricolage: Example-Based Retargeting for Web Design
Authors: Ranjitha Kumar, Jerry O. Talton, Salman Ahmad, Scott R. Klemmer
Abstract: The Web provides a corpus of design examples unparalleled in human history. However, leveraging existing designs to produce new pages is often difficult. This paper introduces the Bricolage algorithm for transferring design and content between Web pages. Bricolage employs a novel, structured-prediction technique that learns to create coherent mappings between pages by training on human-generated exemplars. The produced mappings are then used to automatically transfer the content from one page into the style and layout of another. We show that Bricolage can learn to accurately reproduce human page mappings, and that it provides a general, efficient, and automatic technique for retargeting content between a variety of real Web pages.
Title: Secret Sharing Scheme Suitable for Cloud Computing
Authors: Satoshi Takahashi, Keiichi Iwamura
Abstract: Secret sharing schemes have recently been considered for application to cloud computing, in which many users distribute multiple data items to servers. However, when Shamir's (k, n) secret sharing is applied to cloud systems, the total amount of share data increases to more than n times the amount of the secret. Therefore, in this paper we propose a new secret sharing scheme, different from ramp-type secret sharing, that can reduce the amount of share data and is suitable for cloud systems, and we prove that it is computationally secure.
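For context, a minimal sketch of standard Shamir (k, n) sharing over a prime field, illustrating the n-fold share overhead the proposed scheme targets; the prime and parameters are arbitrary demo values:

```python
import random

P = 2**127 - 1  # a Mersenne prime large enough for the demo secret

def split(secret, k, n):
    """Each of the n shares is as large as the secret itself, so total
    share data is n times the secret's size -- the overhead at issue."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    f = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(i, f(i)) for i in range(1, n + 1)]

def combine(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(42, k=3, n=5)
print(combine(shares[:3]))  # any 3 of the 5 shares recover 42
```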
Abstract: Knowledge mobilisation is a transition from the prevailing knowledge management technology that has been widely used in industry for the last 20 years to a new methodology and some innovative methods for knowledge representation, formation and development and for knowledge retrieval and distribution. Knowledge mobilisation aims at coming to terms with some of the problems of knowledge management and at the same time to introduce new theory, new methods and new technology. More precisely, this paper presents an outline of a fuzzy ontology as an enhanced version of classical ontology and demonstrates some advantages for practical decision making. We show that a number of soft computing techniques, e.g. aggregation functions and interval-valued fuzzy numbers, will support effective and practical decision making on the basis of the fuzzy ontology. We demonstrate the knowledge mobilisation methods with the construction of a support system for finding the best available wine for a number of wine drinking occasions using a fuzzy wine ontology and fuzzy reasoning methods; the support system has been implemented for a Nokia N900 smart phone.
Title: Mission Assurance Increased with Regression Testing
Authors: Roland Lang
Abstract: Knowing what to test is an important attribute of any testing campaign, especially when it has to be right or the mission could be in jeopardy. The New Horizons mission, developed and operated by the Johns Hopkins University Applied Physics Laboratory, received a planned major upgrade to its Mission Operations and Control (MOC) ground system architecture. Early in mission planning it was recognized that the ground system platform would require an upgrade to assure continued support of the technology used for spacecraft operations. With the planned update of the six-year-old operational ground architecture from Solaris 8 to Solaris 10, it was critical that the new architecture maintain critical operations and control functions. The New Horizons spacecraft is heading to its historic rendezvous with Pluto in July 2015 and then proceeding into the Kuiper Belt. This paper discusses the Independent Software Acceptance Testing (ISAT) regression test campaign that played a critical role in assuring the continued success of the New Horizons mission. The New Horizons ISAT process was designed to assure that all the requirements were being met for the ground software functions developed to support the mission objectives. The ISAT team developed a test plan with a series of test case designs. The test objectives were to verify that the software developed from the requirements functioned as expected in the operational environment. As the test cases were developed and executed, a regression test suite was identified at the functional level. This regression test suite would serve as a crucial resource in assuring that the operational system continued to function as required with such a large-scale change being introduced. Some of the New Horizons ground software changes required modifications to the most critical functions of the operational software. Of particular concern was that the new MOC architecture (Solaris 10) is Intel-based and little-endian, while the legacy architecture (Solaris 8) was SPARC-based and big-endian. The possibility of byte-swap issues that might not have been identified in the required software changes was very real, and such issues can be difficult to find. The ability to have test designs that would exercise all major functions and operations was invaluable in assuring that critical operations and tools would operate as they had since first operational use. With the longevity of the mission also came the realization that the original ISAT team would not be the people working on the ISAT regression testing. The ability to have access to all original test designs and test results identified in the regression test suite greatly improved the ability to identify not only the expected system behavior, but also the actual behavior with the old architecture.
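To illustrate the endianness concern, a small sketch showing how the same four bytes decode differently under big-endian (SPARC-style) and little-endian (Intel-style) rules; the value is arbitrary:

```python
import struct

raw = struct.pack(">I", 0x12345678)      # word written big-endian (SPARC)
big = struct.unpack(">I", raw)[0]        # read with the original byte order
little = struct.unpack("<I", raw)[0]     # read little-endian (Intel) in error
print(hex(big), hex(little))             # 0x12345678 0x78563412
```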
Publish Year: 2013
Published in: Aerospace - IEEE
Number of Pages: 8
Topic: Software engineering, software testing, integration testing
Title: Numerical simulation and optimization of CO2 sequestration in saline aquifers
Authors: Zheming Zhang, Ramesh Agarwal
Abstract: With heightened concerns on CO2 emissions from coal fired electricity generation plants, there has been major emphasis in recent years on the development of safe and economical Carbon Dioxide Capture and Sequestration (CCS) technology worldwide. Saline reservoirs are attractive geological sites for CO2 sequestration because of their huge capacity for long term sequestration. Over the last decade, numerical simulation codes have been developed in US, Europe and Japan to determine a priori the CO2 storage capacity of a saline aquifer and to provide risk assessment with reasonable confidence before the actual deployment of CO2 sequestration can proceed with enormous investment. In US, the 2nd version of Transport of Unsaturated Groundwater and Heat (TOUGH2) numerical simulator has been widely used for this purpose. However at present, it does not have the ability to determine optimal parameters such as injection rate, injection pressure, injection depth for vertical and horizontal wells, etc. for optimization of the CO2 storage capacity and for minimizing the leakage potential by confining the plume migration. This paper describes the development of a “Genetic Algorithm (GA)” based optimizer for TOUGH2 that can be used by the industry with good confidence to optimize the CO2 storage capacity in a saline aquifer of interest. This new code including the TOUGH2 and the GA optimizer is designated as “GATOUGH2”. It has been validated by conducting simulations of three widely used benchmark problems by the CCS researchers worldwide: (a) study of CO2 plume evolution and leakage through an abandoned well, (b) study of enhanced CH4 recovery in combination with CO2 storage in depleted gas reservoirs, and (c) study of CO2 injection into a heterogeneous geological formation. The results of these simulations are in excellent agreement with those of other researchers using different codes. The validated code has been employed to optimize the proposed water-alternating-gas (WAG) injection scheme for (a) a vertical CO2 injection well and (b) a horizontal CO2 injection well, in order to optimize the CO2 sequestration capacity of an aquifer. The optimized calculations from GATOUGH2 are compared with the brute force nearly optimized results obtained by performing a large number of calculations. These comparisons demonstrate the significant efficiency and accuracy of GATOUGH2 as an optimizer compared to using TOUGH2 in a brute force manner. This capability holds a great promise in studying a host of other problems in CO2 sequestration such as how to optimally accelerate the capillary trapping, accelerate the dissolution of CO2 in water or brine, and immobilize the CO2 plume.
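As a sketch of the optimizer's core idea (not GATOUGH2 itself), a minimal genetic-algorithm loop over a toy objective standing in for a TOUGH2 reservoir run; all names, bounds, and parameters are illustrative:

```python
import random

def toy_storage(params):                   # stand-in for a TOUGH2 simulation
    rate, depth = params
    return -(rate - 3.0) ** 2 - (depth - 7.0) ** 2   # peak at (3, 7)

def ga(objective, bounds, pop_size=30, generations=60, mut=0.1):
    pop = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective, reverse=True)        # rank by fitness
        parents = pop[: pop_size // 2]               # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # crossover
            if random.random() < mut:                            # mutation
                i = random.randrange(len(child))
                lo, hi = bounds[i]
                child[i] = random.uniform(lo, hi)
            children.append(child)
        pop = parents + children
    return max(pop, key=objective)

print(ga(toy_storage, bounds=[(0, 10), (0, 20)]))  # converges near [3.0, 7.0]
```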
Publish Year: 2013
Published in: Computers and Fluids - Science Direct
Abstract: In this paper we present a Universal and Interoperable Ground Control Station (UIGCS) simulator for fixed and rotary wing Unmanned Aerial Vehicles (UAVs), and all types of payloads. One of the major constraints is to operate and manage multiple legacy and future UAVs, taking into account compliance with the NATO Combined/Joint Services Operational Environment (STANAG 4586). Another purpose of the station is to assign the UAV a certain degree of autonomy, via autonomous planification/replanification strategies. The paper is organized as follows. In Section 2, we describe the non-linear models of the fixed and rotary wing UAVs that we use in the simulator. In Section 3, we describe the simulator architecture, which is based upon interacting modules programmed independently. This simulator is linked with an open source flight simulator, to simulate the video flow and the moving target in 3D. To conclude this part, we tackle briefly the problem of connecting the Matlab/Simulink software (used to model the UAV's dynamics) with the simulation of the virtual environment. Section 5 deals with the control module of a flight path of the UAV. The control system is divided into four distinct hierarchical layers: flight path, navigation controller, autopilot and flight control surfaces controller. In Section 6, we focus on the trajectory planification/replanification question for fixed wing UAVs. Indeed, one of the goals of this work is to increase the autonomy of the UAV. We propose two types of algorithms, based upon 1) the method of the tangent and 2) an original Lyapunov-type method. These algorithms allow either joining a fixed pattern or tracking a moving target. Finally, Section 7 presents simulation results obtained on our simulator, concerning a rather complicated mission scenario.
Title: The role of the atmosphere in the provision of ecosystem services
Authors: Ellen J. Cooter, Anne Rea, Randy Bruins, Donna Schwede, Robin Dennis
Abstract: Solving the environmental problems that we are facing today requires holistic approaches to analysis and decision making that include social and economic aspects. The concept of ecosystem services, defined as the benefits people obtain from ecosystems, is one potential tool to perform such assessments. The objective of this paper is to demonstrate the need for an integrated approach that explicitly includes the contribution of atmospheric processes and functions to the quantification of air–ecosystem services. First, final and intermediate air–ecosystem services are defined. Next, an ecological production function for clean and clear air is described, and its numerical counterpart (the Community Multiscale Air Quality model) is introduced. An illustrative numerical example is developed that simulates potential changes in air–ecosystem services associated with the conversion of evergreen forest land in Mississippi, Alabama and Georgia to commercial crop land. This one-atmosphere approach captures a broad range of service increases and decreases. Results for the forest to cropland conversion scenario suggest that although such change could lead to increased biomass (food) production services, there could also be coincident, seasonally variable decreases in clean and clear air–ecosystem services (i.e., increased levels of ozone and particulate matter) associated with increased fertilizer application. Metrics that support the quantification of these regional air–ecosystem changes require regional ecosystem production functions that fully integrate biotic as well as abiotic components of terrestrial ecosystems, and do so on finer temporal scales than are used for the assessment of most ecosystem services.
Publish Year: 2013
Published in: Science of The Total Environment - Science Direct
Title: Construction of fuzzy ontologies from fuzzy XML models
Authors: Fu Zhang, Z.M. Ma, Li Yan
Abstract: The success and proliferation of the Semantic Web depends heavily on the construction of Web ontologies. However, classical ontology construction approaches are not sufficient for handling the imprecise and uncertain information that is commonly found in many application domains. Therefore, great efforts toward the construction of fuzzy ontologies have been made in recent years. In particular, as XML is imposing itself as a standard for representing and exchanging information on the Web, topics related to the modeling of fuzzy data have become very interesting in the XML data context. Therefore, constructing fuzzy ontologies from fuzzy XML data resources may upgrade existing fuzzy XML data to Semantic Web contents, and the constructed fuzzy ontologies may be useful for improving some fuzzy XML applications. This paper proposes a formal approach and an automated tool for constructing fuzzy ontologies from fuzzy XML data resources. Firstly, we propose a formal definition of fuzzy XML models (including the document structure, fuzzy DTDs, and the document content, fuzzy XML documents). On this basis, we propose a formal approach for constructing fuzzy ontologies from fuzzy XML models, i.e., transforming a fuzzy XML model (including a fuzzy DTD and a fuzzy XML document) into a fuzzy ontology. Also, we give a proof of correctness of the construction approach and provide a detailed construction example. Furthermore, we implement a prototype tool called FXML2FOnto, which can automatically construct fuzzy ontologies from fuzzy XML models. Finally, in order to show that the constructed fuzzy ontologies may be useful for improving some fuzzy XML applications, we focus on investigating how to reason on fuzzy XML models (e.g., conformance, inclusion, and equivalence) based on the constructed fuzzy ontologies, and it turns out that the reasoning tasks for fuzzy XML models can be checked by means of the reasoning mechanism of fuzzy ontologies.
Publish Year: 2013
Published in: Knowledge-Based Systems - Science Direct
Abstract: The Belief Rule Base (BRB) is an expert system approach which can handle both qualitative and quantitative information. One application of the BRB is the Rule-base Inference Methodology using the Evidential Reasoning approach (RIMER). Using the BRB, RIMER can handle different types of information under uncertainty. However, there is a combinatorial explosion problem when there are too many attributes and/or too many alternatives for each attribute in the BRB. Most current approaches are designed to reduce the number of alternatives for each attribute, where the rules are derived from physical systems and are redundant in number. However, these approaches are not applicable when the rules are given by experts and the BRB should not be oversized. A structure learning approach is proposed using Grey Target (GT), Multidimensional Scaling (MDS), Isomap and Principal Component Analysis (PCA) respectively, named GT–RIMER, MDS–RIMER, Isomap–RIMER and PCA–RIMER. A case is studied to evaluate the overall capability of an Armored System of Systems. The efficiency of the proposed approach is validated by the case study results: the BRB is downsized using any of the four techniques, and PCA–RIMER has shown excellent performance. Furthermore, the robustness of PCA–RIMER is further verified under different conditions with a varied number of attributes.
Publish Year: 2013
Published in: Knowledge-Based Systems - Science Direct
Title: Electromagnetic Interference with RFID Readers in Hospitals
Authors: Yue Ying, Dirk Fischer, Uvo Hölscher
Abstract: Radio frequency identification (RFID) applications have become popular in many areas. RFID uses wireless technology to transmit information stored within an RFID-tag to an RFID-reader. Passive RFID-tags are powered through the received signal from the reader. Therefore the reader has to transmit its request with a certain power to bridge the gap between reader and tag. In contrast to bar-code technology, this technique allows communication without requiring a line of sight. RFID-technology is beginning to be introduced in the medical environment, e.g. for patient tracking, staff location, inventory management etc. Implementation of this technology in the medical environment has to guarantee patients' safety. However, the electromagnetic field generated by RFID-readers may interfere with medical devices. This study evaluates the electromagnetic interference of RFID-readers with medical devices, discusses the results with respect to the safety standards, and proposes a process to manage the risks associated with RFID-technology in the medical environment.
Title: Engineering Mathematics: The Odd Order Theorem Proof
Authors: Georges Gonthier
Abstract: Even with the assistance of computer tools, the formalized description and verification of research-level mathematics remains a daunting task, not least because of the talent with which mathematicians combine diverse theories to achieve their ends. By combining tools and techniques from type theory, language design, and software engineering we have managed to capture enough of these practices to formalize the proof of the Odd Order theorem, a landmark result in Group Theory.
Abstract: This paper presents a VLSI implementation of an adaptive noise canceller based on the least mean square (LMS) algorithm. First, the adaptive parameters are obtained by simulating the noise canceller in MATLAB. A Simulink model of the adaptive noise canceller was developed, and the noise is suppressed to a much larger extent in recovering the original signal. The data, such as input and output signals, desired signal, step size factor and coefficients of the adaptive filter, were processed by FPGA. Finally, the functions of the field programmable gate array based system structure for the adaptive noise canceller based on the LMS algorithm are synthesized, simulated, and implemented on a Xilinx XC3S200 field programmable gate array using the Xilinx ISE tool. The research results show that it is feasible to implement and use an adaptive least mean square filter based adaptive noise canceller design, which consumed a low power of 0.156 W at 29.1 °C in a single field programmable gate array chip.
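A minimal LMS noise-canceller loop of the kind simulated in MATLAB above, sketched here in Python/NumPy; the noise path, filter length, and step size are assumed demo values:

```python
import numpy as np

rng = np.random.default_rng(1)
n, taps, mu = 4000, 8, 0.01
signal = np.sin(2 * np.pi * 0.01 * np.arange(n))       # desired clean signal
noise_ref = rng.normal(size=n)                         # reference noise input
noise_in = np.convolve(noise_ref, [0.8, -0.3, 0.2], "same")  # unknown path
primary = signal + noise_in                            # signal + correlated noise

w = np.zeros(taps)
out = np.zeros(n)
for i in range(taps, n):
    x = noise_ref[i - taps:i][::-1]   # most recent reference samples
    e = primary[i] - w @ x            # error = cleaned output sample
    w += 2 * mu * e * x               # LMS coefficient update
    out[i] = e

print("residual noise power:", np.mean((out[taps:] - signal[taps:]) ** 2))
```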
Title: Information Fusion and Discounting Techniques for Decision Support in Aerospace
Authors: Fiona Browne, Yan Jin, Niall Rooney, Hui Wang
Abstract: Decision makers are required to make critical decisions throughout all stages of the life-cycle in large-scale projects. These decisions are important as they impact upon the outcome and the success of projects. In this paper we present an evidential reasoning framework to aid decision-makers in the decision-making process. This approach utilizes the Dezert-Smarandache Theory (DSmT) to fuse heterogeneous evidence sources that suffer from levels of uncertainty, imprecision and conflict to provide beliefs for decision options. To analyze the impact that source reliability and priority have upon the decision-making process, a reliability discounting technique along with a priority discounting technique are applied. Application of the evidential reasoning framework is illustrated using a case study based in the aerospace domain.
Title: Aero-Optical Effects In Free-Space Laser Communications
Authors: Stanislav Gordeyev, Eric Jumper
Abstract: When a laser beam is transmitted from an airborne platform, it must first pass through a relatively thin region of turbulent flow in the immediate vicinity of the airplane. Unsteady density variations present in the turbulent flow will imprint spatial/temporal variations on the otherwise planar outgoing wavefronts. These variations in wavefronts will force the beam to move, change its shape and even break into several spots on a distant target. These aero-optical effects [1,2], even in the absence of any atmospheric optical distortions, can significantly degrade the performance of any free-space, laser-based airborne system at subsonic, transonic or supersonic speeds.
Title: A Cognitive Link Adaptation Strategy with Differentiated Responses on Wi-Fi System-on-Chip
Authors: Taeyoung Lee, Myounghwan Lee, Jaeeun Kang, Kyungik Cho, ChilYoul Hacky Yang, and Scott Seongwook Lee
Abstract: The dynamic change of wireless channel condition causes poor Wi-Fi quality of service. To cope with the channel variation of wireless communication, the essential strategy for the physical layer with multi-rate capability is the link adaptation. However, because definite guidelines for rate changes do not exist, most link adaptation algorithms can misleadingly increase transmission rates. Therefore, we propose a cognitive link adaptation scheme, sensing link adaptation urgency and responding with differentiated rate control such as the long-term rate adaptation with a short-term rate drop. To evaluate the proposed scheme, we implement the link adaptation scheme on Wi-Fi System-on-Chip, applying the optimal HW/SW partitioning technique to reduce the SW processing delay from heavy operating calculation for the algorithm. The experiment shows that the proposed algorithm increases the packet arrival rate in a real Wi-Fi environment by reducing transmission failure due to wrong rate changes.
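A schematic sketch of the differentiated-response idea (slow long-term climb, fast short-term drop); the rates, thresholds, and class names are illustrative, not the implemented SoC algorithm:

```python
RATES = [6, 12, 24, 36, 48, 54]  # Mbps

class DifferentiatedRateControl:
    def __init__(self, up_after=10, down_after=2):
        self.idx, self.ok, self.bad = 0, 0, 0
        self.up_after, self.down_after = up_after, down_after

    def report(self, success: bool) -> int:
        if success:
            self.ok, self.bad = self.ok + 1, 0
            if self.ok >= self.up_after and self.idx < len(RATES) - 1:
                self.idx, self.ok = self.idx + 1, 0    # cautious climb
        else:
            self.bad, self.ok = self.bad + 1, 0
            if self.bad >= self.down_after and self.idx > 0:
                self.idx, self.bad = self.idx - 1, 0   # fast drop
        return RATES[self.idx]

rc = DifferentiatedRateControl()
for outcome in [True] * 12 + [False] * 2 + [True] * 3:
    rate = rc.report(outcome)
print(rate)  # 6: rose to 12 after sustained success, fell quickly on failures
```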
Title: The alpha parallelogram predictor: A lossless compression method for motion capture data
Authors: Pengjie Wang, Zhigeng Pan, Mingmin Zhang, Rynson W.H. Lau, Haiyu Song
Abstract: Motion capture data in an uncompressed form can be expensive to store, and slow to load and transmit. Current compression methods for motion capture data are primarily lossy and cause distortions in the motion data. In this paper, we present a lossless compression algorithm for motion capture data. First, we propose a novel Alpha Parallelogram Predictor (APP) to estimate the DOF (degree of freedom) of each child joint from those of its immediate neighbors and parents that have already been processed. The prediction parameter of the predictor, which is referred to as the alpha parameter, is adaptively chosen from a carefully designed lookup table. Second, we divide the predicted and actual values into three components: sign, exponent and mantissa. We then compress their corrections separately with context-based arithmetic coding. Compared with other lossless compression methods, our approach can achieve a higher compression ratio with a comparable compression time. It can be used in situations where lossy compression is not preferred.
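To make the sign/exponent/mantissa split concrete, a small sketch decomposing IEEE-754 doubles; the APP prediction itself and the context-based arithmetic coder are not reproduced, and the sample values are invented:

```python
import struct

def float_fields(x: float):
    """Split an IEEE-754 double into its sign, exponent, mantissa fields."""
    bits = struct.unpack(">Q", struct.pack(">d", x))[0]
    sign     = bits >> 63
    exponent = (bits >> 52) & 0x7FF
    mantissa = bits & ((1 << 52) - 1)
    return sign, exponent, mantissa

predicted, actual = 41.98, 42.01        # e.g. a prediction vs. the true DOF
for v in (predicted, actual):
    print(float_fields(v))
# Close values share the sign and exponent and differ only in low mantissa
# bits, which is what makes per-field residuals cheap to encode.
```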
Publish Year: 2013
Published in: Information Sciences - Science Direct
Title: Solution blown nanofibrous membrane for microfiltration
Authors: Xupin Zhuang, Lei Shi, Kaifei Jia, Bowen Cheng, Weimin Kang
Abstract: Nanofibrous membranes have attracted attention in microfiltration. Solution blowing is a new nanofiber fabrication method with high productivity. In this study, a polyvinylidene fluoride (PVDF) nanofibrous mat was successfully solution blown using a multi-orifice die. The fibers were mostly 60-280 nm in diameter and three-dimensionally curled, which resulted in a loose construction with a high porosity of 95.8%. The nanofibrous mat was further hot-pressed to increase its integrity. The structure and microfiltration performance were evaluated. The results showed that the crystallinity of the membranes increased, and the porosity and pore size decreased, after the hot-pressing treatment. The hot-pressed membranes showed a high retention ratio against microparticles and a high pure-water flux, which will help the solution blown membrane find application in high-flux microfiltration.
Publish Year: 2013
Published in: Journal of Membrane Science - Science Direct
Title: Quotients and weakly algebraic sets in pseudoeffect algebras
Authors: HaiYang Li, JiGen Peng
Abstract: In the paper, we show that the quotient E/I of a lattice-ordered pseudoeffect algebra E with respect to a normal weak Riesz ideal I is linearly ordered if and only if I is a prime normal weak Riesz ideal, and E/I is a representable pseudo MV-algebra if and only if I is an intersection of prime normal weak Riesz ideals. Moreover, we introduce the concept of weakly algebraic sets in pseudoeffect algebras, discuss the characterizations of weakly algebraic sets, and show that the weakly algebraic sets in a pseudoeffect algebra E are in a one-to-one correspondence with the normal weak Riesz ideals in E.
Abstract: Web Services is a technology for building distributed software applications that are built upon a set of information and communication standards. Among those standards is the Web Services Description Language (WSDL) which is an XML-based language for describing service descriptions. Service providers will publish WSDL documents of their Web services so that service consumers can learn about service capability and how to interface with the services. Since WSDL documents are the primary source of service information, readability of WSDL documents is of concern to service providers, i.e. service descriptions should be understood with ease by service consumers. Providing highly readable service descriptions can then be used as a strategy to attract service consumers. However, given highly readable information in the WSDL documents, competitors are able to learn know-how and can copy the design to offer competing services. Security attacks such as information espionage, client impersonation, command injection, and denial of service are also possible since attackers can learn about exchanged data and invocation patterns from WSDL documents. While readability of service descriptions makes Web services discoverable, it contributes to service vulnerability too. Service designers therefore should consider this trade-off when designing service descriptions. Currently there is no readability measurement for WSDL documents. We propose an approach to such measurement so that service designers can determine if readability is too low or too high with regard to service discoverability, service imitation, and service attack issues, and then can consider increasing or lowering service description readability accordingly. Our readability measurement is based on the concepts or terms in service domain knowledge. Given a WSDL document as a service description, readability is defined in terms of the use of difficult words in the description and the use of words that are key concepts in the service domain. As an example, we measure readability of the WSDL document of public Web services, and outline a method to lower or increase readability.
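A hypothetical sketch of such a measurement, scoring identifier words against domain-concept and difficult-word lists; both lists and the weighting are invented for illustration:

```python
import re

DOMAIN_CONCEPTS = {"hotel", "reservation", "checkin", "checkout", "price"}
DIFFICULT_WORDS = {"xact", "hndlr", "impl", "tmp", "proc"}

def wsdl_readability(names):
    """Score WSDL identifiers: reward domain-concept words, penalize
    difficult words (a hypothetical metric in the spirit described)."""
    words = [w.lower() for n in names
             for w in re.findall(r"[A-Za-z][a-z]*", n)]  # split camelCase
    if not words:
        return 0.0
    concept_ratio   = sum(w in DOMAIN_CONCEPTS for w in words) / len(words)
    difficult_ratio = sum(w in DIFFICULT_WORDS for w in words) / len(words)
    return concept_ratio - difficult_ratio     # higher = more readable

print(wsdl_readability(["makeHotelReservation", "getPrice"]))   # 0.6
print(wsdl_readability(["procXactHndlr", "tmpImpl"]))           # -1.0
```

A designer could then lower or raise the score deliberately, e.g. by renaming operations toward or away from domain vocabulary, trading discoverability against imitation and attack exposure as the abstract describes.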
Title: A method for the coating of a polymer inclusion membrane with a monolayer of silver nanoparticles
Authors: Ya Ya N. Bonggotgetsakul, Robert W. Cattrall, Spas D. Kolev
Abstract: The preparation of silver nanoparticles (AgNPs) is described using a polymer inclusion membrane (PIM) consisting of 45% (m/m) di-(2-ethylhexyl)phosphoric acid (D2EHPA) and 55% (m/m) poly(vinyl chloride) (PVC) as a template. The Ag(I) ion was firstly extracted into the membrane via cation-exchange and then subsequently reduced with NaBH4, trisodium citrate, citric acid, or L-ascorbic acid to form AgNPs. The most effective reducing agent was found to be L-ascorbic acid, which at pH 2.0 formed a uniform monolayer of AgNPs of an average size of 360 nm on the surface of the PIM. Citric acid also produced AgNPs, but these were embedded in the bulk of the membrane and did not provide good surface coverage. NaBH4 and trisodium citrate, on the other hand, gave rise to the formation of black silver oxide on the membrane surface. Factors such as the membrane loading with Ag(I), PIM composition, reduction time, temperature and shaking time were found to have a significant influence on the surface coverage and size of the AgNPs.
Publish Year: 2013
Published in: Journal of Membrane Science - Science Direct
Title: DNA extraction method with improved efficiency and specificity using DNA methyltransferase and click chemistry
Authors: Alexander B. Artyukhin, YounHi Woo
Abstract: In an attempt to develop an alternative method to extract DNA from complex samples with much improved sensitivity and efficiency, here we report a proof-of-concept work for a new DNA extraction method using DNA methyltransferase (MTase) and click chemistry. According to our preliminary data, the method improves on current methods by (i) employing a DNA-specific enzyme, TaqI DNA MTase, for improved selectivity, and by (ii) capturing the DNA through a covalent bond to the functionalized surface, enabling a broad range of treatments that yield the final sample DNA with minimal loss and higher purity, such that it is highly compatible with downstream analyses. By employing MTase, a highly DNA-specific and efficient enzyme, and click chemistry, we demonstrated that as little as 0.1 fg of λ-DNA (close to copy number 1) was captured on silica (Si)-based beads by forming a covalent bond between an azide group on the surface and the propargyl moiety on the DNA. This method holds promise for versatile applications where extraction of minute amounts of DNA plays a critical role, such as basic and applied molecular biology research, bioforensic and biosecurity sciences, and state-of-the-art detection methods.
Publish Year: 2012
Published in: Analytical Biochemistry - Science Direct
Title: The Application of Coal Cleaning Detection System based on Embedded Real-time Image Processing
Authors: Qian Mu, Jixian Dong
Abstract: This paper introduces a high-speed embedded image processing system, based on FPGA and DSP collaboration, used in coal preparation detection technology. The system uses a modular design, including a real-time image acquisition module, a DSP image processing module, and a peripheral interface module. The image processing performs the following operations: image smoothing and enhancement, edge detection, region segmentation, gray-scale analysis, and pattern recognition. After completing these algorithms, the system determines the image features of coal and gangue, particularly their gray-scale values and centers of gravity. By analyzing these image characteristics, the system identifies the gangue. The image processing supports the subsequent refuse-discharge treatment.
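A schematic version of the described pipeline using OpenCV on a synthetic frame; the test image, thresholds, and blob interpretation are placeholders, not the paper's tuned values:

```python
import cv2
import numpy as np

# Synthetic stand-in frame: two blobs of different gray levels on a dark belt.
img = np.zeros((120, 160), np.uint8)
cv2.circle(img, (50, 60), 20, 180, -1)     # brighter blob
cv2.circle(img, (110, 60), 20, 60, -1)     # darker blob

smoothed = cv2.GaussianBlur(img, (5, 5), 0)               # smoothing/enhance
edges = cv2.Canny(smoothed, 50, 150)                      # edge detection
_, mask = cv2.threshold(smoothed, 0, 255,
                        cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # segmentation

# Gray-scale analysis: mean gray value and center of gravity of the
# segmented region, the features the system uses to tell coal from gangue.
mean_gray = cv2.mean(smoothed, mask=mask)[0]
m = cv2.moments(mask, binaryImage=True)
cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
print("edge pixels:", int(np.count_nonzero(edges)))
print("mean gray:", round(mean_gray, 1), "centroid:", (round(cx), round(cy)))
```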
Title: Business Process based Initial Modeling at Software Development
Authors: J Tick
Abstract: The fundamental problem of high-quality software development is the elaboration of a well-designed and correct model of the system being developed. The most critical phase is the "birth" of the model, the specification of the initial model. This paper describes a method that uses an already existing Business Process Model to set up the initial model for the software to be developed. Business Process Management systems in particular can be well described by business processes using workflows. This paper makes recommendations on how to incorporate workflows in the development process. The basic principle of the method for elaborating the initial model is that, from the usually already existing workflow, a UML activity diagram can be worked out with the help of a mapping, which can be a useful additional source for further development processes.
Title: Qcompiler: Quantum compilation with the CSD method
Authors: YG Chen, JB Wang
Abstract: In this paper, we present a general quantum computation compiler, which maps any given quantum algorithm to a quantum circuit consisting of a sequential set of elementary quantum logic gates, based on recursive cosine-sine decomposition. The resulting quantum circuit diagram is provided by directly linking the package output, written in LaTeX, to Qcircuit.tex. We illustrate the use of the Qcompiler package through various examples with full details of the derived quantum circuits. Besides its accuracy, generality and simplicity, Qcompiler produces quantum circuits with a significantly reduced number of gates when the systems under study have a high degree of symmetry.
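For illustration, one cosine-sine decomposition step of the kind such a compiler applies recursively, using SciPy's cossin (assuming SciPy >= 1.5; the matrix is a random example, not one of the paper's circuits):

```python
import numpy as np
from scipy.linalg import cossin
from scipy.stats import unitary_group

# A 4x4 unitary U (a 2-qubit gate) factors as
#   U = diag(u1, u2) @ CS @ diag(v1, v2),
# and the diagonal blocks are then decomposed again, recursively.
U = unitary_group.rvs(4, random_state=7)
u, cs, vdh = cossin(U, p=2, q=2)          # block sizes p = q = 2
print(np.allclose(U, u @ cs @ vdh))       # True: exact factorization
print(np.round(cs, 3))                    # the cosine-sine "rotation" core
```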
Publish Year: 2013
Published in: Computer Physics Communications - Science Direct
Title: Traffic Information Based Power Saving Mechanism
Authors: Ju No Han, Seong Gon Choi
Abstract: This paper proposes a standby power saving mechanism by which the user can shut down a home network appliance (i.e. a personal computer) and its standby power from a remote site. Generally, power management of a home appliance requires physical contact with it. In the proposed mechanism, the smartphone is given the power on/off state information of the home network appliance in real time. The user can transmit a power-termination order to the home network appliance through the smartphone. If the home gateway receives the power-termination order from the smartphone, the order is sent to the home network appliance and the standby power is blocked. The appliance's power and standby power can thus be blocked through the proposed method even though the user has no physical contact with the appliance. That is, the proposed mechanism gives the user convenience in electric power management. In addition, power can be saved by reducing unnecessary operation time and blocking the standby power of the home network appliance.
Title: Survey on Middleware Systems in Cloud Computing Integration
Authors: Jonathan AP Marpaung, Mangal Sain, HoonJae Lee
Abstract: Cloud computing is an increasingly attractive solution to providing cheap, scalable, and easy to deploy computing services. This paper presents a survey on the middleware technology used to integrate different cloud platforms/services, applications running in the cloud, and the cloud with on-premise data centers. We discuss the ongoing research projects and solutions being developed such as Altocumulus, AppScale, Cloudify, and mOSAIC. We analyze the features already available in current solutions as well as the future technology and standards needed to ensure seamless integration and more widespread adoption of cloud computing solutions.
Title: The Current Status of Medical Physics in Asia-Oceania
Authors: TS Suh
Abstract: Asia has a diverse cultural and economic background. The Asia-Oceania Federation of Organizations for Medical Physics (AFOMP) was born in July 2000 during the World Congress 2000 in Chicago. The aim of AFOMP was to provide a solid relationship and useful information for closer collaboration and mutual support among the AFOMP members. The purpose of this presentation is to introduce the status of medical physics in Asia-Oceania, providing information on the organization, regional congresses, the status of medical physicists, etc. One major role of AFOMP is to support the development of medical physics in the Asia region. To improve the status of medical physicists in Asia, goal-oriented action plans are required: in particular, advancing medical physics in developing countries, strengthening the education, training and professional development of medical physicists, and promoting good relations and the exchange of information with other international related organizations through regional congresses, websites, etc.